
Optimization of Transformer API #30957

Merged: 4 commits into PaddlePaddle:develop on Feb 23, 2021

Conversation

@xiemoyuan (Contributor) commented on Feb 8, 2021

PR types

Function optimization

PR changes

APIs

Describe

Support the 'bool' and 'int' data types for the attention mask in MultiHeadAttention.
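
For reference, here is a minimal usage sketch (not part of the original PR description). It assumes the `paddle.nn.MultiHeadAttention` API and that, for a bool or int mask, True/nonzero positions are attended while False/zero positions are masked out; the shapes and values are illustrative only.

```python
import paddle
import paddle.nn as nn

# Illustrative sizes, chosen for the example.
batch_size, num_heads, seq_len, embed_dim = 2, 2, 4, 8

# Random query tensor with shape [batch_size, seq_len, embed_dim].
query = paddle.rand([batch_size, seq_len, embed_dim])

# Boolean attention mask (assumed semantics: True keeps a position, False masks it out).
# Before this change, only float masks added directly to the attention scores were accepted.
attn_mask = paddle.full([batch_size, num_heads, seq_len, seq_len], True, dtype='bool')

mha = nn.MultiHeadAttention(embed_dim, num_heads)
out = mha(query, query, query, attn_mask=attn_mask)
print(out.shape)  # [2, 4, 8]
```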

@paddle-bot-old (bot) commented on Feb 8, 2021

Thanks for your contribution!
Please wait for the CI results first. See the Paddle CI Manual for details.

Three review threads on python/paddle/nn/layer/transformer.py were marked outdated and resolved.
@paddle-bot-old (bot) commented

Sorry to inform you that the CI runs for 222c07a passed more than 7 days ago. To prevent PR conflicts, you need to re-run all CIs manually.

guoshengCS merged commit edacb62 into PaddlePaddle:develop on Feb 23, 2021.
xiemoyuan deleted the optimize_transformer_api branch on February 24, 2021.